Comp760, Lecture 1: Basic Functional Analysis
Abstract
In 1906 Jensen founded the theory of convex functions, which enabled him to prove a considerable extension of the AM-GM inequality. Recall that a subset D of a real vector space is called convex if every convex linear combination of a pair of points of D is in D; equivalently, if x, y ∈ D, then tx + (1−t)y ∈ D for every t ∈ [0, 1]. Given a convex set D, a function f : D → R is called convex if for every x, y ∈ D and every t ∈ [0, 1], f(tx + (1−t)y) ≤ tf(x) + (1−t)f(y). If the inequality is strict for every t ∈ (0, 1), then the function is called strictly convex. Trivially, f is convex if and only if its epigraph {(x, y) ∈ D × R : y ≥ f(x)} is convex. Also note that f : D → R is convex if and only if for every x, y ∈ D the restriction fxy : [x, y] → R, defined by fxy : tx + (1−t)y ↦ f(tx + (1−t)y), is convex. By the mean value theorem (a consequence of Rolle’s theorem), if fxy is twice differentiable, then this is equivalent to f′′xy ≥ 0. A function f : D → R is concave if −f is convex. The following important inequality is often called Jensen’s inequality.
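The snippet ends before the inequality itself is stated; for reference, the standard finite form (stated here from general knowledge, not quoted from the lecture) can be written as:

```latex
% Jensen's inequality (finite form): for a convex f : D -> R,
% points x_1, ..., x_n in D and weights t_1, ..., t_n >= 0 with sum 1,
\[
  f\Bigl(\sum_{i=1}^{n} t_i x_i\Bigr) \;\le\; \sum_{i=1}^{n} t_i f(x_i).
\]
% Applying this with the convex function f(x) = -\log x on (0, \infty)
% and uniform weights t_i = 1/n recovers the AM-GM inequality:
\[
  \frac{1}{n}\sum_{i=1}^{n} x_i \;\ge\; \Bigl(\prod_{i=1}^{n} x_i\Bigr)^{1/n}.
\]
```

This is the extension of AM-GM the abstract alludes to: AM-GM is the special case of Jensen’s inequality for the negative logarithm with equal weights.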
Similar resources
Comp760, Summary of Lecture 20
The material for today’s lecture is from [BGPW13]. Determining the 0-error internal information cost of the AND function is considerably more difficult than determining its external information cost. At first glance it might seem surprising that one can do something different from the protocol that we discussed in the external case. First we state a protocol that is optimal for the case of ...
Comp760, Summary of Lecture 15
We can think of these as communication tasks. Note that all the above definitions require something about the distribution of (X,Y, π(X,Y )). In the following definition we attempt to formally define a communication task. This is going to be useful when we discuss the direct sum theorems. However, the formalism of Definition 1 is not very essential for our purpose and the natural intuition that...
Comp760, Lectures 12-13: Reverse Bonami-Beckner and Expansion, Gaussian Space
In this lecture we are going to address a question related to expansion. We choose an element in a product space and change each coordinate with a small probability. How large is the probability that starting in a given small set A, the new point lands in another small set B? We are going to prove a lower bound on this probability that depends on the relative densities of such sets. To this end...
Comp760, Summary of Lecture 17
In this lecture we will discuss how to compress a protocol with low information cost, but possibly very high communication cost, to a protocol with low communication cost. It will be important to understand how much information is revealed at every step of the protocol. Thus we will take a closer look at the protocol tree, and we shall try to break the information cost into an expression in ter...